35. Batch vs Stochastic Gradient Descent
Stochastic Gradient Descent in Keras
Stochastic (more precisely, mini-batch) gradient descent is very easy to use in Keras. All we need to do is specify the size of the batches when we train the model, as follows:
model.fit(X_train, y_train, epochs=1000, batch_size=100, verbose=0)
Here, we split the data into batches of 100 points each, and the model updates its weights after every batch rather than after a full pass over the dataset.
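To make the batching mechanics concrete, here is a minimal sketch of mini-batch gradient descent written from scratch with NumPy, for a simple linear regression. The function name `minibatch_sgd`, the synthetic data, and all hyperparameter values are illustrative assumptions, not part of Keras:

```python
import numpy as np

def minibatch_sgd(X, y, lr=0.1, epochs=200, batch_size=100, seed=0):
    """Illustrative mini-batch SGD for linear regression (not Keras internals)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        order = rng.permutation(n)              # shuffle the data each epoch
        for start in range(0, n, batch_size):
            idx = order[start:start + batch_size]
            Xb, yb = X[idx], y[idx]
            err = Xb @ w + b - yb               # predictions minus targets for this batch
            # one weight update per batch, using only the batch's gradient
            w -= lr * (Xb.T @ err) / len(idx)
            b -= lr * err.mean()
    return w, b

# Hypothetical data generated from y = 2x + 1 plus a little noise
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(500, 1))
y = 2 * X[:, 0] + 1 + 0.01 * rng.normal(size=500)
w, b = minibatch_sgd(X, y)
```

With `batch_size=100` and 500 points, each epoch performs 5 weight updates; setting `batch_size=1` would give pure stochastic gradient descent, and `batch_size=500` would give batch gradient descent.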